
    The VODKA sensor: a bio-inspired hyperacute optical position sensing device

    Get PDF
    We have designed and built a simple optical sensor called the Vibrating Optical Device for the Kontrol of Autonomous robots (VODKA), inspired by the "tremor" eye movements observed in many vertebrate and invertebrate animals. In the initial version presented here, the sensor relies on the repetitive micro-translation of a pair of photoreceptors set behind a small lens, and on processing designed to locate a target from the two photoreceptor signals. The VODKA sensor, in which retinal micro-scanning movements are performed via a small piezo-bender actuator driven at a frequency of 40 Hz, was found to be able to locate a contrasting edge with an outstandingly high resolution, 900-fold greater than its static resolution (which is constrained by the interreceptor angle), regardless of the scanning law imposed on the retina. Hyperacuity is thus obtained at very low cost, opening new vistas for the accurate visuo-motor control of robotic platforms. As an example, the sensor was mounted onto a miniature aerial robot that became able to track a moving target accurately by exploiting the robot's uncontrolled random vibrations as the source of its ocular micro-scanning movement. The simplicity, small size, low mass and low power consumption of this optical sensor make it highly suitable for many applications in the fields of metrology, astronomy, robotics, automotive and aerospace engineering. The basic operating principle may also shed new light on the whys and wherefores of the tremor eye movements occurring in both animals and humans.
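
    To make the micro-scanning principle concrete, here is a minimal, self-contained Python simulation sketch of two Gaussian photoreceptors scanned sinusoidally across a contrasting edge at 40 Hz. The acceptance angle, inter-receptor angle, scanning amplitude and the simple sine-reference demodulation are illustrative assumptions, not the actual VODKA processing.

import numpy as np

# Minimal simulation of micro-scanning hyperacuity (illustrative assumptions only).
# Two photoreceptors with Gaussian angular sensitivity, separated by an
# inter-receptor angle DELTA_PHI, are scanned sinusoidally across a contrasting
# edge at 40 Hz. The edge position is read out from the relative modulation
# depth of the two demodulated photoreceptor signals.

DELTA_PHI = 4.0      # inter-receptor angle (deg) -- assumed value
SIGMA = 2.0          # acceptance angle of each photoreceptor (deg) -- assumed
F_SCAN = 40.0        # micro-scanning frequency (Hz), as in the abstract
A_SCAN = 1.0         # scanning amplitude (deg) -- assumed value
FS = 4000.0          # sampling rate (Hz)

def edge_luminance(angle_deg, edge_pos_deg):
    """Contrasting edge: dark to the left of edge_pos_deg, bright to the right."""
    return 0.5 * (1.0 + np.tanh((angle_deg - edge_pos_deg) / 0.1))

def photoreceptor(axis_deg, edge_pos_deg):
    """Gaussian-weighted average of the luminance around the optical axis."""
    angles = np.linspace(axis_deg - 4 * SIGMA, axis_deg + 4 * SIGMA, 200)
    weights = np.exp(-0.5 * ((angles - axis_deg) / SIGMA) ** 2)
    return np.sum(weights * edge_luminance(angles, edge_pos_deg)) / np.sum(weights)

def edge_position_cue(edge_pos_deg, duration=0.1):
    """Demodulate both photoreceptor signals over a few scan periods."""
    t = np.arange(0.0, duration, 1.0 / FS)
    scan = A_SCAN * np.sin(2 * np.pi * F_SCAN * t)       # retinal micro-translation
    ph1 = np.array([photoreceptor(-DELTA_PHI / 2 + s, edge_pos_deg) for s in scan])
    ph2 = np.array([photoreceptor(+DELTA_PHI / 2 + s, edge_pos_deg) for s in scan])
    ref = np.sin(2 * np.pi * F_SCAN * t)                 # demodulation reference
    m1 = abs(np.mean(ph1 * ref))                         # modulation depth, receptor 1
    m2 = abs(np.mean(ph2 * ref))                         # modulation depth, receptor 2
    # Relative modulation: negative when the edge is nearer receptor 1, positive
    # when nearer receptor 2, ~0 midway -- a smooth cue much finer than DELTA_PHI.
    return (m2 - m1) / (m1 + m2 + 1e-12)

for pos in (-1.0, -0.2, 0.0, 0.2, 1.0):
    print(f"edge at {pos:+.1f} deg -> position cue {edge_position_cue(pos):+.3f}")

    The printed cue varies smoothly with edge displacements well below the inter-receptor angle, which is the essence of the hyperacuity described above.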

    A sighted aerial robot with fast gaze and heading stabilization

    No full text
    Autonomous guidance of Micro-Air Vehicles (MAVs) in unknown environments is a challenging task because these artificial creatures have small aeromechanical time constants, which make them prone to be disturbed by gusts of wind. Flying insects are subject to quite similar kinds of disturbances, yet they navigate swiftly and deftly. Flying insects display high-performance visuo-motor control systems that have stood the test of time. They can therefore teach us how vision can be used for immediate and vital actions. We built a 50-gram tethered aerial demonstrator, called OSCAR II, which manages to keep its gaze steadily fixating a target (a dark edge), in spite of nasty thumps that we deliberately gave to its body with a custom-made "slapping machine". The robot's agile yaw reactions are based on:
    - a mechanical decoupling of the eye from the body
    - an active coupling of the robot's heading with its gaze
    - a Visual Fixation Reflex (VFR)
    - a Vestibulo-Ocular Reflex (VOR)
    - an accurate and fast actuator (Voice Coil Motor, VCM)
    The actuator is a 2.4-gram voice coil motor that is able to rotate the eye with a rise time as small as 12 ms, that is, much shorter than the rise time of human oculo-motor saccades. In connection with a micro rate gyro, this actuator endows the robot with a high-performance vestibulo-ocular reflex that keeps the gaze locked onto the target whatever yaw perturbations affect the robot's body. Whenever the robot is destabilized (e.g., by a slap applied to one side), the gaze keeps fixating the target, while serving as the reference to which the robot's heading is servoed. It then takes the robot only 0.6 s to realign its heading with its gaze.
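
    As a rough illustration of how such a reflex pair can reject a body disturbance, the following Python sketch simulates a gaze held by a gyro-driven counter-rotation (VOR) plus a visual fixation correction (VFR). The gains, the first-order actuator lag and the shape of the "slap" are assumed values for the example, not the OSCAR II controller itself.

import numpy as np

# Gaze stabilization sketch: a vestibulo-ocular reflex (VOR) feeding forward the
# rate-gyro signal, plus a visual fixation reflex (VFR) nulling the residual
# retinal error. Gains, the actuator lag and the disturbance shape are assumed.

DT = 0.001               # simulation step (s)
TAU_EYE = 0.012          # assumed first-order eye/VCM lag (s), ~12 ms rise time
K_VOR = 1.0              # feedforward gain on the rate-gyro signal
K_VFR = 30.0             # visual fixation gain on the retinal error (1/s)
TARGET = 0.0             # target bearing in the world frame (deg)

t = np.arange(0.0, 1.0, DT)
body_yaw = np.where(t > 0.2, 20.0 * (1.0 - np.exp(-(t - 0.2) / 0.05)), 0.0)  # a "slap"
body_rate = np.gradient(body_yaw, DT)        # what the rate gyro measures

eye_in_body, eye_rate = 0.0, 0.0             # eye orientation relative to the body
worst_gaze_error = 0.0
for k in range(len(t)):
    gaze = body_yaw[k] + eye_in_body         # gaze direction in the world frame
    retinal_error = TARGET - gaze            # reported by the visual sensor
    # Rate command: counter-rotate against the measured body rate (VOR) and
    # null the residual retinal error (VFR).
    eye_rate_cmd = -K_VOR * body_rate[k] + K_VFR * retinal_error
    eye_rate += (DT / TAU_EYE) * (eye_rate_cmd - eye_rate)   # actuator lag
    eye_in_body += DT * eye_rate
    worst_gaze_error = max(worst_gaze_error, abs(retinal_error))

print(f"peak gaze error: {worst_gaze_error:.2f} deg for a 20 deg yaw disturbance")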

    Steering by Gazing: An Efficient Biomimetic Control Strategy for Visually-guided Micro-Air Vehicles

    No full text
    OSCAR 2 is a twin-engine aerial demonstrator equipped with a monocular visual system, which manages to keep its gaze and its heading steadily fixed on a target (a dark edge or a bar) in spite of the severe random perturbations applied to its body via a ducted fan. The tethered robot stabilizes its gaze on the basis of two Oculomotor Reflexes (ORs) inspired by studies on animals:
    - a Visual Fixation Reflex (VFR)
    - a Vestibulo-Ocular Reflex (VOR)
    One of the key features of this robot is that the eye is decoupled mechanically from the body about the vertical (yaw) axis. To meet the conflicting requirements of high accuracy and fast ocular responses, a miniature (2.4-gram) Voice Coil Motor (VCM) was used, which enables the eye to make a change of orientation within an unusually short rise time (19 ms). The robot, which is equipped with a high-bandwidth (7 Hz) VOR based on an inertial micro rate gyro, is capable of accurate visual fixation as long as there is light. The robot is also able to pursue a moving target in the presence of erratic gusts of wind. Here we present the two interdependent control schemes driving the eye in the robot and the robot in space without any knowledge of the robot's angular position. This "steering by gazing" control strategy, implemented on this lightweight (100-gram) miniature aerial robot, demonstrates the effectiveness of this biomimetic visual/inertial heading control strategy.
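
    The following Python sketch focuses on the interplay between the two loops: the gaze is held on the target by the visual fixation reflex combined with an (assumed perfect) VOR, and the body heading is servoed onto the gaze using only the eye-in-body angle as error signal, with no absolute heading measurement. All gains, dynamics and the disturbance profile are assumptions for illustration, not the robot's actual controller.

import numpy as np

# "Steering by gazing" sketch: stabilize the gaze, then steer the body so that
# its heading realigns with the gaze. The controller never uses the heading.

DT = 0.001          # simulation step (s)
K_VFR = 30.0        # visual fixation gain (1/s)
K_HEAD = 8.0        # heading-onto-gaze servo gain (1/s)
TARGET = 10.0       # target bearing in the world frame (deg)

heading = 0.0       # body yaw (deg), never used by the controller directly
eye_in_body = 0.0   # eye angle relative to the body (deg), assumed measurable

t = np.arange(0.0, 2.0, DT)
for k in range(len(t)):
    gaze = heading + eye_in_body                       # gaze in the world frame
    retinal_error = TARGET - gaze                      # from the visual sensor
    gust = 200.0 if 0.8 < t[k] < 0.82 else 0.0         # brief yaw disturbance (deg/s)

    body_rate = K_HEAD * eye_in_body + gust            # steer the body toward the gaze
    # Eye-in-body rate: counter-rotate against the body rotation (assumed to be
    # perfectly measured by the rate gyro) plus the visual fixation correction.
    eye_rate = -body_rate + K_VFR * retinal_error

    eye_in_body += DT * eye_rate
    heading += DT * body_rate

print(f"final heading {heading:.2f} deg, final gaze {heading + eye_in_body:.2f} deg "
      f"(target {TARGET:.0f} deg)")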

    Insect inspired visual motion sensing and flying robots

    Get PDF
    Flying insects are masters of visual motion sensing, using dedicated motion-processing circuits at low energy and computational cost. Drawing on observations of insect visual guidance, we developed visual motion sensors and bio-inspired autopilots dedicated to flying robots. Optic-flow-based visuomotor control systems have been implemented on an increasingly large number of sighted autonomous robots. In this chapter, we present how we designed and constructed local motion sensors and how we implemented bio-inspired visual guidance schemes on board several micro-aerial vehicles. A hyperacute sensor, in which retinal micro-scanning movements are performed via a small piezo-bender actuator, was mounted onto a miniature aerial robot. The OSCAR II robot is able to track a moving target accurately by exploiting the micro-scanning movement imposed on its eye's retina. We also present two interdependent control schemes, driving the eye's angular position in the robot and the robot's body angular position with respect to a visual target, without any knowledge of the robot's orientation in the global frame. This "steering-by-gazing" control strategy, implemented on this lightweight (100 g) miniature sighted aerial robot, demonstrates the effectiveness of this biomimetic visual/inertial heading control strategy.
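
    As an illustration of the simplest kind of local motion sensing evoked here, the Python sketch below estimates an angular speed from the "time of travel" of a contrast between two adjacent photoreceptors (omega = delta_phi / delta_t). The inter-receptor angle, the smoothed edge response and the threshold detection are assumptions made for the example, not the sensors' actual signal processing.

import numpy as np

# "Time of travel" local motion sensing sketch: the angular speed of a moving
# contrast is estimated as omega = DELTA_PHI / delta_t, where delta_t is the
# delay between the responses of two adjacent photoreceptors.

DELTA_PHI = 4.0          # inter-receptor angle (deg) -- assumed
FS = 2000.0              # sampling rate (Hz)
OMEGA_TRUE = 80.0        # true angular speed of the contrast (deg/s)

t = np.arange(0.0, 0.5, 1.0 / FS)
edge_angle = -10.0 + OMEGA_TRUE * t               # edge sweeping across the eye

def receptor(axis_deg):
    """Smoothed response of a photoreceptor as the edge passes its axis."""
    return 0.5 * (1.0 + np.tanh((edge_angle - axis_deg) / 1.0))

ph1 = receptor(0.0)
ph2 = receptor(DELTA_PHI)

def crossing_time(signal, threshold=0.5):
    """Time at which the signal first exceeds the threshold."""
    return t[np.argmax(signal > threshold)]

delta_t = crossing_time(ph2) - crossing_time(ph1)
omega_est = DELTA_PHI / delta_t
print(f"true speed {OMEGA_TRUE:.0f} deg/s, estimated {omega_est:.1f} deg/s")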

    Minimalist optical sensors & biomimetic oculomotor reflexes: application to aerial robotics (Capteurs optiques minimalistes & réflexes oculomoteurs biomimétiques. Application à la robotique aérienne)

    No full text
    Visual navigation in autonomous robots is usually based on video cameras with several hundred thousand pixels read out sequentially. Real-time processing of such incoming data flows requires major computing resources that would be hard to embed on a micro-aerial vehicle weighing a few grams or tens of grams. There already exist, however, many flying agents whose navigation performance in unknown environments is remarkable, even though they operate on a quite different basis. Birds and insects, in particular, show a unique ability to avoid obstacles and pursue prey or conspecifics. This amazing ability is due to their unique perception of the environment. Insects, despite their low cognitive abilities, perceive their environment quite efficiently thanks to minimalist sensors. Some insects, such as flies, improve their perception of the environment further by stabilizing their visual system, decoupling the head from the body and using an inertial reflex similar to the mammalian "vestibulo-ocular reflex". Stabilization of the "visual platform" is beneficial in that it simplifies the subsequent visual processing and enables efficient navigational strategies to be implemented. The first part of this work, dedicated to visual sensors, focuses on an elementary eye composed of only two photoreceptors (two pixels). We first improved the performance of a bio-inspired angular speed sensor and revisited the working principle of the OSCAR sensor, both previously built at our laboratory. We then developed and constructed a new visual sensor, called VODKA, which allows the angular position of a contrasting edge or bar to be localized with utmost accuracy. In the second part, dedicated to visuo-inertial reflexes, we developed a micro-aerial robot, called OSCAR II. Equipped with our visual sensors and an inertial sensor, OSCAR II, which weighs only 100 grams, is able to maintain its gaze locked onto a stationary target, and to pursue a moving target in yaw, even in the presence of strong aerial disturbances. With its added ability to perform eye saccades, OSCAR II bodes well for tomorrow's micro-aerial vehicles, whose heading will follow the gaze.

    Method and device for measuring the angular position of a rectilinear contrasting edge of an object, and system for fixation and tracking of a target comprising at least one such contrasting edge

    No full text
    The present invention relates to a method for measuring the angular position of a contrasting edge of an object having a luminance transition zone which is substantially rectilinear in a given direction and separates two regions of different luminances. The method comprises the following functional steps:
    - carrying out, in a transverse direction different from the given direction, an amplitude modulation of the signals delivered by a first and a second optical sensor; and
    - calculating an output signal Yout(t) from the signals delivered by the first and the second optical sensors (D1, D2) as a function of the angular position of the luminance transition zone.
    The invention also relates to a device and a set of devices for measuring the angular position of a contrasting edge, and to a steering aid system for fixation and tracking of a target comprising at least one such contrasting edge.

    Method and device for measuring the angular velocity of a luminance transition zone and steering aid system for fixation and tracking of a target comprising at least one such luminance transition zone

    No full text
    The present invention relates to a method for measuring a time delay between two output signals measured by two adjacent optical sensors, the time delay resulting from the movement of an object moving in a given direction. The method comprises the following steps:
    - carrying out a spatial filtering of the signals Ph1 and Ph2 delivered by a first and a second optical sensor;
    - calculating a first-order derivative and carrying out a temporal low-pass filtering of the signals;
    - calculating the second-order derivative of the signals; and
    - measuring the delay between the signals.
    The method is characterized in that the step of measuring the delay is a feedback loop based on an estimated delay between the temporally filtered signals. The invention also relates to a device and a set of devices for measuring the angular velocity of a luminance transition zone, and to a steering aid system for fixation and tracking of a target comprising at least one such luminance transition zone.
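
    The Python sketch below illustrates one way a feedback loop on an estimated delay can be made to work: the estimate is nudged until the first signal, delayed by that estimate, lines up with the second. The gradient-style update rule, the test signals and all constants are assumptions for illustration and do not reproduce the patented processing chain.

import numpy as np

# Delay measurement by a feedback loop on the estimated delay d_hat.
# The angular speed then follows from omega = DELTA_PHI / d_hat.

FS = 2000.0                      # sampling rate (Hz)
TRUE_DELAY = 0.050               # true delay between the two signals (s)
DELTA_PHI = 4.0                  # inter-receptor angle (deg) -- assumed

t = np.arange(0.0, 2.0, 1.0 / FS)
s = np.sin(2 * np.pi * 3 * t) + 0.3 * np.sin(2 * np.pi * 7 * t)  # moving contrast
ph1 = s
ph2 = np.interp(t - TRUE_DELAY, t, s)        # the second receptor sees it later

def delayed(signal, delay):
    """Signal delayed by `delay` seconds (linear interpolation)."""
    return np.interp(t - delay, t, signal)

d_hat = 0.0                      # initial delay estimate (s)
MU = 5e-4                        # adaptation gain -- assumed
for _ in range(200):             # iterate the feedback loop to convergence
    aligned = delayed(ph1, d_hat)
    err = ph2 - aligned                              # misalignment error
    slope = np.gradient(aligned, 1.0 / FS)           # sensitivity to d_hat
    d_hat -= MU * np.mean(err * slope)               # gradient-style correction

omega = DELTA_PHI / d_hat                            # angular speed estimate
print(f"true delay {TRUE_DELAY * 1e3:.1f} ms, estimated {d_hat * 1e3:.1f} ms, "
      f"angular speed ~ {omega:.1f} deg/s")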

    On Chip Rapid Control Prototyping for DC motor

    No full text
    This paper proposes a method for Rapid Control Prototyping (RCP) targeting microcontrollers. The methodology relies on a Matlab/Simulink interface which makes target configuration and coding easier. Developing low-level embedded code is bypassed by a high-level implementation which is straightforward for control system engineers. This article is intended for students, engineers or researchers looking to validate the effectiveness of their control algorithms on industrial targets. The design procedure is illustrated by testing various speed and current feedback loops on a DC motor.
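
    As an indication of the kind of control structure such a prototyping chain is meant to validate, the Python sketch below simulates a cascaded PI controller (inner current loop, outer speed loop) on a simple DC motor model. The motor parameters, gains, limits and sampling period are assumed values chosen only for illustration, not those of the paper's test bench.

import numpy as np

# Cascaded PI control of a DC motor: the outer loop regulates speed and outputs
# a current setpoint, the inner loop regulates current and outputs a voltage.

# --- assumed DC motor parameters ---
R, L = 1.0, 0.5e-3        # armature resistance (ohm) and inductance (H)
KE = KT = 0.02            # back-EMF and torque constants (V.s/rad, N.m/A)
J, B = 1e-5, 1e-6         # rotor inertia (kg.m^2) and viscous friction

DT = 1e-4                 # control/sampling period (s), e.g. 10 kHz
V_MAX = 12.0              # supply voltage limit (V)

class PI:
    """Discrete PI controller with output saturation and conditional anti-windup."""
    def __init__(self, kp, ki, limit):
        self.kp, self.ki, self.limit = kp, ki, limit
        self.integral = 0.0
    def step(self, error):
        u = self.kp * error + self.integral
        if abs(u) < self.limit:                  # freeze the integrator when saturated
            self.integral += self.ki * error * DT
        return max(-self.limit, min(self.limit, u))

speed_pi = PI(kp=0.05, ki=2.0, limit=5.0)        # outputs a current setpoint (A)
current_pi = PI(kp=2.0, ki=2000.0, limit=V_MAX)  # outputs a voltage command (V)

i, omega = 0.0, 0.0                              # motor state: current, speed
omega_ref = 300.0                                # speed setpoint (rad/s)
for k in range(int(0.5 / DT)):                   # simulate 0.5 s
    i_ref = speed_pi.step(omega_ref - omega)     # outer loop: speed -> current ref
    v = current_pi.step(i_ref - i)               # inner loop: current -> voltage
    # Euler integration of the DC motor equations.
    di = (v - R * i - KE * omega) / L
    domega = (KT * i - B * omega) / J
    i += DT * di
    omega += DT * domega

print(f"speed after 0.5 s: {omega:.1f} rad/s (setpoint {omega_ref:.0f} rad/s), "
      f"current {i:.2f} A")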

    On Chip Rapid Control Prototyping for Education

    No full text
    In this note, various test benches for Rapid Control Prototyping (RCP) targeting microcontrollers are presented. The methodology relies on a Matlab/Simulink interface which makes target configuration and coding easier. Developing low-level embedded code is bypassed by a high-level implementation which is straightforward for control system engineers. These benches are intended for students, engineers or researchers looking to validate the effectiveness of their control algorithms on low-cost industrial targets.